Principal components analysis of regularly varying functions
Authors
Abstract
Similar works
An Extension of the Class of Regularly Varying Functions
We define a new class of positive and Lebesgue measurable functions in terms of their asymptotic behavior, which includes the class of regularly varying functions. We also characterize it by transformations, corresponding to generalized moments when these functions are random variables. We study the properties of this new class and discuss its applications to Extreme Value The...
Persian Handwriting Analysis Using Functional Principal Components
Principal components analysis is a well-known statistical method for dealing with large dependent data sets. It is also used with functional data, both for data reduction and for representing variation. On the other hand, "handwriting" is one of the objects studied in various statistical fields such as pattern recognition and shape analysis. Considering time as the argument,...
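The data-reduction use described above can be sketched for functional data by evaluating curves on a common grid and eigen-decomposing the empirical covariance. This is a minimal illustration on synthetic curves (not the handwriting data studied in the paper): two smooth modes of variation plus noise, which the first two functional principal components should capture almost entirely.

```python
import numpy as np

rng = np.random.default_rng(1)
grid = np.linspace(0.0, 1.0, 50)               # common evaluation grid
n = 100

# Synthetic curves: two smooth modes of variation plus small noise.
scores = rng.normal(size=(n, 2))
curves = (scores[:, [0]] * np.sin(2 * np.pi * grid)
          + 0.5 * scores[:, [1]] * np.cos(2 * np.pi * grid)
          + 0.05 * rng.normal(size=(n, grid.size)))

# Discretized functional PCA: eigen-decompose the empirical covariance.
centered = curves - curves.mean(axis=0)
cov = centered.T @ centered / n                 # 50 x 50 covariance on the grid
evals, evecs = np.linalg.eigh(cov)
evals, evecs = evals[::-1], evecs[:, ::-1]      # sort eigenpairs in descending order

# Share of total variation captured by the first two components.
explained = evals[:2].sum() / evals.sum()
```

With two true modes of variation, `explained` comes out close to 1, which is the "variation representation" role of the components: the leading eigenfunctions (columns of `evecs`) recover the sine and cosine shapes.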
Functional principal components analysis of workload capacity functions.
Workload capacity, an important concept in many areas of psychology, describes processing efficiency across changes in workload. The capacity coefficient is a function across time that provides a useful measure of this construct. Until now, most analyses of the capacity coefficient have focused on the magnitude of this function, and often only in terms of a qualitative comparison (greater than ...
Regularly varying probability densities
The convolution of regularly varying probability densities is proved asymptotic to their sum, and hence is also regularly varying. Extensions to rapid variation, O-regular variation, and other types of asymptotic decay are also given. Regularly varying distribution functions have long been used in probability theory; see e.g. Feller [7, VIII.8], Bingham, Goldie and Teugels [5, Ch. 8]. This note...
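The tail-equivalence claim above can be checked numerically. A small sketch with two identical Pareto densities (regularly varying, index −(α+1)): since each density integrates to one, the result predicts (f∗f)(x) ≈ 2 f(x) for large x. The grid size and evaluation point here are illustrative choices, not taken from the paper.

```python
import numpy as np

def pareto_pdf(t, alpha=2.0):
    # Pareto(alpha) density on [1, inf): regularly varying with index -(alpha + 1).
    t = np.asarray(t, dtype=float)
    return np.where(t >= 1.0, alpha * t ** (-alpha - 1.0), 0.0)

def conv_at(x, n=200_000, alpha=2.0):
    # Midpoint-rule approximation of (f * f)(x) = int_1^{x-1} f(t) f(x - t) dt.
    edges = np.linspace(1.0, x - 1.0, n + 1)
    mids = 0.5 * (edges[:-1] + edges[1:])
    h = edges[1] - edges[0]
    return float(np.sum(pareto_pdf(mids, alpha) * pareto_pdf(x - mids, alpha)) * h)

x = 200.0
ratio = conv_at(x) / (2.0 * float(pareto_pdf(x)))   # should approach 1 as x grows
```

At x = 200 the ratio is already close to 1, and it converges to 1 as x increases, matching the asymptotic statement in the abstract.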
Online Principal Components Analysis
We consider the online version of the well-known Principal Component Analysis (PCA) problem. In standard PCA, the input to the problem is a set of d-dimensional vectors X = [x_1, ..., x_n] and a target dimension k < d; the output is a set of k-dimensional vectors Y = [y_1, ..., y_n] that minimize the reconstruction error: min_Φ Σ_i ‖x_i − Φ y_i‖². Here, Φ ∈ R^{d×k} is restricted to being isometric. The...
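In the standard (offline) setting described above, the optimal isometric Φ is given by the top-k left singular vectors of X, and the minimal reconstruction error equals the sum of the discarded squared singular values. A minimal numpy sketch of that offline objective (random data, illustrative dimensions):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, k = 5, 200, 2
X = rng.normal(size=(d, n))            # columns x_i are the d-dimensional inputs

# Optimal isometric Phi: the top-k left singular vectors of X.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :k]                          # d x k, satisfies Phi^T Phi = I_k

Y = Phi.T @ X                           # k-dimensional outputs y_i
err = np.sum((X - Phi @ Y) ** 2)        # sum_i ||x_i - Phi y_i||^2
```

Here `err` coincides with the sum of the squared singular values beyond the k-th, which is the baseline an online algorithm's regret is measured against.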
Journal
Journal title: Bernoulli
Year: 2019
ISSN: 1350-7265
DOI: 10.3150/19-bej1113